A Robust Accelerated Optimization Algorithm for Strongly Convex Functions
Abstract
This work proposes an accelerated first-order algorithm we call the Robust Momentum Method for optimizing smooth strongly convex functions. The algorithm has a single scalar parameter that can be tuned to trade off robustness to gradient noise versus worst-case convergence rate. At one extreme, the algorithm is faster than Nesterov’s Fast Gradient Method by a constant factor but more fragile to noise. At the other extreme, the algorithm reduces to the Gradient Method and is very robust to noise. The algorithm design technique is inspired by methods from classical control theory and the resulting algorithm has a simple analytical form. Algorithm performance is verified on a series of numerical simulations.
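As a concrete sketch of the iteration this describes, the following Python code implements a three-parameter momentum template of the kind used by the Robust Momentum Method, with a ρ-dependent tuning that follows the analytical form reported in the arXiv version (arXiv:1710.04753). The parameter formulas should be treated as assumptions of this sketch and checked against the paper; at ρ = 1 − 1/√κ they coincide with the Triple Momentum parameters (the fast, fragile extreme), while larger ρ trades convergence rate for robustness to gradient noise.

    import numpy as np

    def robust_momentum(grad, x0, m, L, rho, iters=500):
        """Sketch of a robust-momentum-style iteration (after arXiv:1710.04753).

        grad : callable returning the gradient of an L-smooth, m-strongly
               convex function.
        rho  : scalar tuning parameter in [1 - 1/sqrt(kappa), 1 - 1/kappa];
               smaller rho = faster worst-case rate, larger rho = more
               robustness to gradient noise.

        The (alpha, beta, gamma) formulas below are assumptions of this
        sketch; verify them against the paper before relying on them.
        """
        kappa = L / m
        alpha = kappa * (1 - rho) ** 2 * (1 + rho) / L
        beta = kappa * rho ** 3 / (kappa - 1)
        gamma = rho ** 3 / ((kappa - 1) * (1 - rho) ** 2 * (1 + rho))

        x_prev = x0.copy()
        x = x0.copy()
        for _ in range(iters):
            y = x + gamma * (x - x_prev)  # extrapolated point where the gradient is evaluated
            x, x_prev = x + beta * (x - x_prev) - alpha * grad(y), x
        return x

    # Example: ill-conditioned quadratic f(x) = 0.5 x'Ax with m = 1, L = 100
    if __name__ == "__main__":
        A = np.diag([1.0, 100.0])
        grad = lambda x: A @ x
        kappa = 100.0
        rho_fast = 1 - 1 / np.sqrt(kappa)  # fast extreme of the tuning range
        x_star = robust_momentum(grad, np.array([1.0, 1.0]), 1.0, 100.0, rho_fast)
        print(np.linalg.norm(x_star))      # ~0: iterates converge to the minimizer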
Similar resources
An algorithm for approximating nondominated points of convex multiobjective optimization problems
In this paper, we present an algorithm for generating approximate nondominated points of a multiobjective optimization problem (MOP), where the constraints and the objective functions are convex. We provide outer and inner approximations of nondominated points and prove that inner approximations provide a set of approximate weakly nondominated points. The proposed algorithm can be appl...
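The excerpt truncates before the approximation scheme itself. As a minimal illustration of the underlying principle (for a convex MOP, every minimizer of a strictly positively weighted sum of the objectives is a nondominated point), the sketch below samples the nondominated set by weighted-sum scalarization. The objectives and weights are hypothetical, and the sketch is a standard stand-in rather than the paper's outer/inner approximation algorithm.

    import numpy as np
    from scipy.optimize import minimize

    # Two convex objectives on R^2 (hypothetical example, not from the paper).
    f1 = lambda x: (x[0] - 1) ** 2 + x[1] ** 2
    f2 = lambda x: x[0] ** 2 + (x[1] - 1) ** 2

    # Weighted-sum scalarization: for convex MOPs, every minimizer of
    # w*f1 + (1-w)*f2 with 0 < w < 1 is a nondominated point.
    front = []
    for w in np.linspace(0.01, 0.99, 25):
        scalarized = lambda x, w=w: w * f1(x) + (1 - w) * f2(x)
        res = minimize(scalarized, x0=np.zeros(2))
        front.append((f1(res.x), f2(res.x)))

    print(front[:3])  # sampled approximation of the nondominated set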
Accelerating Asynchronous Algorithms for Convex Optimization by Momentum Compensation
Asynchronous algorithms have attracted much attention recently due to the growing demand for solving large-scale optimization problems. However, accelerated versions of asynchronous algorithms are rarely studied. In this paper, we propose the "momentum compensation" technique to accelerate asynchronous algorithms for convex problems. Specifically, we first accelerate the plain Asynchronous ...
On the quadratic support of strongly convex functions
In this paper, we first introduce the notion of $c$-affine functions for $c > 0$. Then we deal with some properties of strongly convex functions in real inner product spaces by using a quadratic support function at each point which is $c$-affine. Moreover, a Hyers–Ulam stability result for strongly convex functions is shown.
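For reference, the quadratic support in question can be written out. Under the convention that $f$ is $c$-strongly convex when $f(x) - c\|x\|^2$ is convex (the convention the $c$-affine support suggests), the support inequality at a point $x_0$ is the standard fact:

    % Quadratic support of a c-strongly convex f at x_0
    % (g is a (sub)gradient of f at x_0):
    f(x) \;\ge\; f(x_0) + \langle g,\, x - x_0 \rangle + c\,\|x - x_0\|^2
    \qquad \text{for all } x .
    % The right-hand side q(x) is c-affine:
    % q(x) = c\|x\|^2 + (\text{affine in } x).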
An Interior Point Algorithm for Solving Convex Quadratic Semidefinite Optimization Problems Using a New Kernel Function
In this paper, we consider convex quadratic semidefinite optimization problems and provide a primal-dual Interior Point Method (IPM) based on a new kernel function with a trigonometric barrier term. The iteration complexity of the algorithm is analyzed using some easy-to-check and mild conditions. Although our proposed kernel function is neither a Self-Regular (SR) fun...
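The excerpt does not show the proposed kernel itself. For orientation, a kernel function in this IPM framework is a univariate $\psi$ with $\psi(1) = \psi'(1) = 0$ and $\psi''(t) > 0$ that blows up at the boundary of the cone; the classical logarithmic kernel below is the textbook example (an illustration only, not the paper's trigonometric kernel):

    % Classical logarithmic kernel (illustration; the paper's kernel
    % replaces the barrier term -log t with a trigonometric barrier):
    \psi(t) = \frac{t^2 - 1}{2} - \log t , \qquad t > 0 ,
    % with psi(1) = psi'(1) = 0, psi''(t) = 1 + 1/t^2 > 0, and
    % psi(t) -> infinity as t -> 0+. The induced barrier is
    % Psi(v) = sum_i psi(v_i), evaluated at the scaled iterates.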
Distributed Accelerated Proximal Coordinate Gradient Methods
We develop a general accelerated proximal coordinate descent algorithm in distributed settings (DisAPCG) for the optimization problem that minimizes the sum of two convex functions: the first part f is smooth with a gradient oracle, and the other part Ψ is separable with respect to blocks of coordinates and has a simple known structure (e.g., the L1 norm). Our algorithm gets new accelerated convergen...
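DisAPCG itself is accelerated and distributed; as a minimal single-machine illustration of the proximal coordinate gradient step it builds on, the sketch below runs plain (non-accelerated) proximal coordinate descent on f(x) + λ‖x‖₁, using the soft-threshold as the proximal map of the separable L1 term. Function names, step sizes, and the lasso-type example are assumptions of the sketch.

    import numpy as np

    def soft_threshold(z, t):
        """Proximal map of t*|.|, the separable L1 prox."""
        return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

    def prox_coord_descent(grad_i, lip, lam, x0, iters=1000, seed=0):
        """Plain proximal coordinate descent for f(x) + lam*||x||_1,
        a stripped-down relative of the APCG-style methods above,
        not DisAPCG itself.

        grad_i(x, i): partial derivative of the smooth part f at coordinate i
        lip[i]:       coordinate-wise Lipschitz constant of grad_i
        """
        rng = np.random.default_rng(seed)
        x = x0.copy()
        for _ in range(iters):
            i = rng.integers(len(x))              # sample a coordinate
            step = 1.0 / lip[i]
            z = x[i] - step * grad_i(x, i)        # coordinate gradient step
            x[i] = soft_threshold(z, step * lam)  # coordinate prox step
        return x

    # Example: lasso-type objective 0.5*||Ax - b||^2 + lam*||x||_1
    A = np.array([[1.0, 0.5], [0.5, 2.0]])
    b = np.array([1.0, -1.0])
    grad_i = lambda x, i: A[:, i] @ (A @ x - b)
    lip = np.sum(A * A, axis=0)                   # ||A[:, i]||^2
    print(prox_coord_descent(grad_i, lip, lam=0.1, x0=np.zeros(2)))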
Journal: CoRR
Volume: abs/1710.04753
Pages: -
Publication year: 2017